Installing Keras
$ sudo pip install keras
To save Keras models
$ sudo pip install h5py
Entering the virtual environment
$ cd deep-learning-theano
$ source bin/activate
To use Theano as the backend
$ KERAS_BACKEND=theano jupyter notebook
To use TensorFlow as the backend
$ KERAS_BACKEND=tensorflow jupyter notebook
To generate graphs of the network, just run the following commands in the terminal
$ pip install graphviz
$ brew install graphviz
$ pip install pydot
In [1]:
# Load libraries
import keras
import os
import numpy
import pydot
import graphviz
from keras.models import Sequential
from keras.layers import Dense
from keras.models import model_from_json
from keras.models import model_from_yaml
In [2]:
# Set seed
numpy.random.seed(12345)
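As a quick sanity check on why the seed matters: fixing it makes NumPy's generator produce the same sequence on every run, so weight initialization (and therefore training) is reproducible. A minimal sketch:

```python
import numpy

# Same seed, same draws: results become reproducible across runs.
numpy.random.seed(12345)
a = numpy.random.rand(3)

numpy.random.seed(12345)
b = numpy.random.rand(3)

print(numpy.allclose(a, b))  # the two draws are identical
```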
In [3]:
# Load dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
In [4]:
# Independent Variables
X = dataset[:,0:8]
# Dependent Variables
Y = dataset[:,8]
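To illustrate what the slicing above does without needing the CSV file on disk, here is a sketch with a hypothetical 3-row stand-in for `pima-indians-diabetes.csv`: 8 feature columns followed by a 0/1 class label in column 8.

```python
import io
import numpy

# Hypothetical sample rows in the same layout as the real dataset.
csv = io.StringIO(
    "6,148,72,35,0,33.6,0.627,50,1\n"
    "1,85,66,29,0,26.6,0.351,31,0\n"
    "8,183,64,0,0,23.3,0.672,32,1\n"
)
dataset = numpy.loadtxt(csv, delimiter=",")

X = dataset[:, 0:8]  # all rows, columns 0..7 (independent variables)
Y = dataset[:, 8]    # all rows, column 8 (dependent variable)

print(X.shape, Y.shape)  # (3, 8) (3,)
```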
In [5]:
Y
Out[5]:
In [6]:
# First we create a sequential model, to which we'll add the layers of our network topology
model = Sequential()
'''
In `input_dim` we set the number of input variables in our dataset (here, 8).
The `Dense` class indicates that the layer is fully connected, and its first parameter is the number of neurons.
The `relu` activation is the rectified linear unit, f(x) = max(0, x), where x is the input to the neuron. Details
can be found in R. Hahnloser, R. Sarpeshkar, M. A. Mahowald, R. J. Douglas, H. S. Seung (2000).
Digital selection and analogue amplification coexist in a cortex-inspired silicon circuit. Nature. 405. pp. 947-951.
Each layer is configured with the number of neurons, the weight initialization method
(the `kernel_initializer` argument), and the `activation` argument.
'''
model.add(Dense(12, input_dim=8, activation='relu')) # First hidden layer
model.add(Dense(8, kernel_initializer = 'uniform', activation='relu')) # Second hidden layer
model.add(Dense(10, kernel_initializer = 'uniform', activation='sigmoid')) # Third hidden layer
model.add(Dense(50, kernel_initializer = 'normal', activation='sigmoid')) # Fourth hidden layer
model.add(Dense(24, kernel_initializer = 'normal', activation='sigmoid')) # Fifth hidden layer
# model.add(Dense(300, kernel_initializer = 'normal', activation='sigmoid')) # Sixth hidden layer
# model.add(Dense(200, kernel_initializer = 'uniform', activation='softmax')) # Seventh hidden layer
# model.add(Dense(100, kernel_initializer = 'normal', activation='softmax')) # Eighth hidden layer
# model.add(Dense(50, kernel_initializer = 'uniform', activation='relu')) # Ninth hidden layer
# model.add(Dense(150, kernel_initializer = 'normal', activation='relu')) # Tenth hidden layer
model.add(Dense(1, activation='sigmoid')) # Output layer: sigmoid maps the output into (0, 1), matching our binary labels
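A useful way to check the topology by hand: a `Dense` layer with `n` inputs and `m` neurons holds `n * m` weights plus `m` biases. Counting only the uncommented layers above (8 → 12 → 8 → 10 → 50 → 24 → 1), a small arithmetic sketch:

```python
# Parameters of a Dense layer: inputs * units weights + units biases.
def dense_params(inputs, units):
    return inputs * units + units

# The active layers above: 8 -> 12 -> 8 -> 10 -> 50 -> 24 -> 1
sizes = [8, 12, 8, 10, 50, 24, 1]
per_layer = [dense_params(i, o) for i, o in zip(sizes, sizes[1:])]

print(per_layer)       # [108, 104, 90, 550, 1224, 25]
print(sum(per_layer))  # 2101
```

This should match what `model.summary()` reports for the same architecture.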
In [7]:
# To compile the model we need to set: 1) the loss function, 2) the optimizer, and 3) the metrics used to judge model quality
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
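To make the `binary_crossentropy` loss concrete, here is a plain-Python sketch of the formula it averages, -[y·log(p) + (1-y)·log(1-p)], over the samples. This is a hand computation for intuition, not Keras's internal implementation:

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Average of -[y*log(p) + (1-y)*log(1-p)]; eps guards against log(0).
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# A confident, correct classifier scores close to 0...
print(binary_crossentropy([1, 0, 1], [0.9, 0.1, 0.8]))
# ...while uninformative 0.5 predictions score log(2) ~= 0.693.
print(binary_crossentropy([1, 0], [0.5, 0.5]))
```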
In [8]:
# Fit the model
model.fit(X, Y, epochs=750, batch_size=30, verbose=0)
Out[8]:
In [9]:
# Evaluate to see how good the model is
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
In [10]:
# Making predictions using X dataset
predictions = model.predict(X)
In [11]:
# Round the predictions, since our labels take only the values 0 and 1
rounded = [round(x[0]) for x in predictions]
# Print output
print(rounded)
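The thresholding step above can be sketched with hypothetical sigmoid outputs (the nested lists mimic the shape `model.predict` returns, one probability per row):

```python
# Hypothetical sigmoid outputs for six samples and their true labels.
predictions = [[0.12], [0.91], [0.48], [0.73], [0.05], [0.66]]
Y = [0, 1, 1, 1, 0, 1]

# round() thresholds each probability at 0.5, giving class 0 or 1.
rounded = [round(p[0]) for p in predictions]
print(rounded)  # [0, 1, 0, 1, 0, 1]

# Accuracy = fraction of rounded predictions that match the labels.
accuracy = sum(r == y for r, y in zip(rounded, Y)) / len(Y)
print(accuracy)  # 5 of 6 correct
```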
In [12]:
# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model and weights to disk in JSON and HDF5 formats")
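The key idea in this save/load split is that the JSON holds only the architecture (a text description of the layers), while the HDF5 file holds the numeric weights. A minimal stand-in using only the standard-library `json` module, with a hypothetical config dict in place of a real Keras architecture:

```python
import json

# Hypothetical stand-in for a model architecture: JSON captures the
# topology as text; trained weights are stored separately (HDF5).
config = {
    "layers": [
        {"units": 12, "activation": "relu"},
        {"units": 1, "activation": "sigmoid"},
    ]
}

text = json.dumps(config)    # analogous to model.to_json()
restored = json.loads(text)  # analogous to model_from_json()

print(restored == config)  # True: the topology survives the round trip
```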
In [13]:
# load json and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model and weights from disk")
In [14]:
# Compile and test model
loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
# Score the model
score = loaded_model.evaluate(X, Y, verbose=0)
# Results of scoring
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
In [15]:
# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model and weights to disk in YAML and HDF5 formats")
In [16]:
# load YAML and create model
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model and weights from disk in YAML and HDF5 formats")
In [17]:
# Recompile the saved model
loaded_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
# Score the saved model against the test data
score = loaded_model.evaluate(X, Y, verbose=0)
# Final Score
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))